28 research outputs found

    CDS-MIP: CDS-based Multiple Itineraries Planning for mobile agents in wireless sensor network

    Using mobile agents (MAs) in wireless sensor networks (WSNs) for data aggregation has gained significant attention. Planning an optimal itinerary for a mobile agent is an essential step before data gathering begins. Many approaches have been proposed to solve the MA itinerary-planning problem, but all of them assume that the MAs visit every sensor node (SN) plus a large number of intermediate nodes. This assumption imposes a burden: the agent's size grows with the number of visited SNs, so it consumes more energy and spends more time on its migration. None of the proposed approaches takes into account the significant role that connected dominating nodes can play as a virtual infrastructure in WSNs. This article introduces a novel energy-efficient itinerary-planning approach based on minimum connected dominating sets (CDSs) for multiple agents dedicated to the data-gathering process. In our proposed approach, instead of planning the itineraries over all SNs, we plan them among subsets of the MCDS in each cluster. Thus, the agent need not visit every SN, and each itinerary contains few (if any) intermediate nodes. Simulation results demonstrate that our approach is more efficient than other approaches in terms of overall energy consumption and task execution time.
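    The abstract does not give the paper's MCDS construction; as a rough illustration of the underlying idea, a standard greedy dominating-set heuristic on a toy sensor topology might look like the sketch below. The node IDs and topology are hypothetical, and the extra step of connecting the chosen dominators into a *connected* dominating set is omitted.

```python
# Greedy dominating-set heuristic -- an illustrative sketch, not the
# paper's algorithm.  At each step, pick the node that covers the most
# still-uncovered sensor nodes.

def greedy_dominating_set(adjacency):
    uncovered = set(adjacency)          # every sensor node starts uncovered
    dominators = set()
    while uncovered:
        # a node "covers" itself and its neighbours
        best = max(adjacency,
                   key=lambda n: len(({n} | adjacency[n]) & uncovered))
        dominators.add(best)
        uncovered -= {best} | adjacency[best]
    return dominators

# Toy 6-node WSN topology (undirected adjacency lists).
wsn = {
    0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1, 4},
    3: {1, 5}, 4: {2, 5}, 5: {3, 4},
}
ds = greedy_dominating_set(wsn)
# Every node is either a dominator or adjacent to one, so an agent
# touring only `ds` can still reach data from all SNs.
assert all(n in ds or wsn[n] & ds for n in wsn)
```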

    Adaptive traffic lights based on traffic flow prediction using machine learning models

    Traffic congestion prediction is one of the essential components of intelligent transport systems (ITS). This is due to the rapid growth of the population and, consequently, the high number of vehicles in cities. Nowadays, the problem of traffic congestion attracts more and more attention from researchers in the field of ITS. Traffic congestion can be predicted in advance by analyzing traffic flow data. In this article, we used machine learning algorithms such as linear regression, random forest regressor, decision tree regressor, gradient boosting regressor, and K-neighbors regressor to predict traffic flow and reduce traffic congestion at intersections. We used the public UK national road traffic dataset to test our models. All machine learning algorithms obtained good performance metrics, indicating that they are valid for implementation in smart traffic light systems. Next, we implemented an adaptive traffic light system based on a random forest regressor model, which adjusts the timing of green and red lights depending on the road width, traffic density, types of vehicles, and expected traffic. Simulations of the proposed system show a 30.8% reduction in traffic congestion, justifying its effectiveness and the value of deploying it to regulate signaling at intersections.
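    The article's actual controller combines road width, density, vehicle types, and predicted traffic; as a much simpler hypothetical sketch of the adaptive-timing idea, one could split a fixed signal cycle among an intersection's approaches in proportion to the predicted flow alone:

```python
# Hypothetical green-time allocation rule: share a fixed cycle among the
# approaches of an intersection in proportion to predicted traffic flow,
# with a minimum green time so no approach is starved.  This is an
# illustrative sketch, not the paper's controller.

def allocate_green(predicted_flows, cycle=120, min_green=10):
    total = sum(predicted_flows)
    spare = cycle - min_green * len(predicted_flows)
    return [min_green + spare * f / total for f in predicted_flows]

greens = allocate_green([300, 100, 50, 50])   # vehicles/hour per approach
assert abs(sum(greens) - 120) < 1e-9          # the full cycle is used
assert greens[0] > greens[1] > greens[2]      # busier roads get more green
```

    In a deployed system the `predicted_flows` input would come from the trained regressor rather than being hard-coded.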

    An Optimizing Approach for Multi Constraints Reassignment Problem of Human Resources

    This paper presents an effective approach to optimizing the reassignment of human resources in an enterprise composed of several production units, taking human characteristics into consideration. The approach consists of two steps. The first formalizes the studied problem, which in practice takes the form of the generalized assignment problem (GAP), a known NP-hard problem; additionally, the variables in our formulation are interlinked by certain constraints. These two properties justify the problem's considerable complexity. The second step solves this complex problem using a genetic algorithm. We present experimental results justifying the validity of the proposed approach. The solution obtained yields an optimal assignment of personnel that can improve average productivity and profitability, or at least balance them across the enterprise's sites.
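    The paper's encoding and constraint set are richer than the abstract states; as a minimal sketch of solving a GAP-style assignment with a genetic algorithm, the toy instance below (made-up costs and capacities, a one-point crossover, and a capacity-violation penalty) illustrates the second step:

```python
import random

# Illustrative GA for a tiny generalized assignment problem: assign each
# worker to one site, minimising cost while respecting site capacities.
# Data, operators and parameters are hypothetical.

random.seed(0)
COST = [[4, 7], [6, 3], [5, 5], [8, 2]]    # cost[worker][site]
CAPACITY = [3, 2]                           # max workers per site

def fitness(assign):
    load = [assign.count(s) for s in range(len(CAPACITY))]
    penalty = sum(max(0, l - c) for l, c in zip(load, CAPACITY)) * 100
    return sum(COST[w][s] for w, s in enumerate(assign)) + penalty

def evolve(pop_size=30, generations=60):
    pop = [[random.randrange(2) for _ in COST] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]     # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(COST))     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:                # mutation
                child[random.randrange(len(COST))] = random.randrange(2)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
load = [best.count(s) for s in range(len(CAPACITY))]
assert all(l <= c for l, c in zip(load, CAPACITY))   # capacities respected
assert fitness(best) < 100                           # no penalty incurred
```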

    Harnessing Machine Learning and Multi Agent Systems for Health Crisis Analysis in North Africa

    The COVID-19 pandemic has presented a significant global health challenge, including in Morocco. The measures taken in response had direct repercussions on the economy and on essential institutions of society, but they also produced indirect effects. This article focuses on those indirect consequences for environmental sustainability, showing that the net effect has been positive in terms of reduced carbon emissions, oil exploration operations, and pollution. The study introduces a novel approach to predicting and simulating the pandemic's dynamics in Morocco using machine learning and multi-agent system models. We collected and processed daily data on COVID-19 cases, deaths, and interventions in Morocco from March 2, 2020, to June 30, 2021. We developed and validated several machine learning models, including decision trees, random forests, and support vector machines, to predict daily COVID-19 cases and deaths. Additionally, we designed a multi-agent system model to simulate the interactions among individuals, social groups, and the government in response to the pandemic, using agent-based modelling and game theory. Our results indicate that the machine learning models achieved high accuracy and generalization performance, with an average R-squared value of 0.83 for cases and 0.90 for deaths. The multi-agent simulations reveal the complex dynamics and trade-offs among pandemic control measures, economic activity, and social welfare in Morocco, suggesting that a coordinated and adaptive approach is necessary to balance these factors. Our study contributes to the growing literature on using machine learning and multi-agent systems for pandemic prediction and management, providing valuable insights and recommendations for policymakers and public health officials in Morocco and beyond.
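    The paper's calibrated multi-agent model is not reproduced in the abstract; a minimal agent-based SIR-style sketch of the general technique, with made-up agent counts, contact rates, and probabilities, could look like this:

```python
import random

# Minimal agent-based epidemic sketch (susceptible/infected/recovered).
# All parameters are illustrative, not the study's calibrated values.

random.seed(42)
N, CONTACTS, P_INFECT, P_RECOVER = 500, 4, 0.05, 0.1
state = ["S"] * N
state[0] = "I"                          # one initial case

for day in range(120):
    infected = [i for i, s in enumerate(state) if s == "I"]
    for i in infected:
        for _ in range(CONTACTS):       # each infected agent meets others
            j = random.randrange(N)
            if state[j] == "S" and random.random() < P_INFECT:
                state[j] = "I"
        if random.random() < P_RECOVER:
            state[i] = "R"

# The population is conserved across the three compartments.
assert state.count("S") + state.count("I") + state.count("R") == N
```

    A model closer to the paper's would add agents for social groups and the government, and couple interventions (e.g. lockdown levels) to the infection dynamics via game-theoretic payoffs.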

    Comparative study of the Security Analysis of IoT systems using attack trees algorithm

    The Internet of Things (IoT) is a rapidly evolving environment that allows users to operate and control a wide variety of connected objects. The 20 billion IoT devices expected to be in use by 2020 are only the tip of the iceberg: according to IDC, the number of connected devices will rise to 41.6 billion over the next five years, producing over 80 zettabytes of data by 2025, which will severely impact the environment. These connected environments increase the attack surface of a system, since the risks are multiplied by the number of connected devices. The devices are responsible for more or less critical tasks and can therefore be targeted by malicious users. In this paper, we present a methodology to evaluate the security of IoT systems. We propose a way to represent IoT systems, coupled with attack trees, in order to assess the chances of success of an attack on a given system.
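    The abstract does not spell out the paper's tree semantics; a common convention, sketched below with a hypothetical IoT scenario and made-up leaf probabilities, is that an AND gate requires every child attack step to succeed while an OR gate requires any one of them (children treated as independent):

```python
# Minimal AND/OR attack-tree evaluator -- an illustrative sketch of this
# style of analysis.  Node structure and probabilities are hypothetical.

def success_probability(node):
    """Leaves carry a probability; AND needs every child, OR needs any one."""
    if "p" in node:
        return node["p"]
    probs = [success_probability(c) for c in node["children"]]
    if node["gate"] == "AND":
        result = 1.0
        for p in probs:
            result *= p
        return result
    # OR gate: 1 minus the probability that every child attack fails
    fail = 1.0
    for p in probs:
        fail *= 1.0 - p
    return 1.0 - fail

compromise_camera = {                   # hypothetical root goal
    "gate": "OR",
    "children": [
        {"gate": "AND", "children": [{"p": 0.5},    # default credentials
                                     {"p": 0.8}]},  # exposed telnet port
        {"p": 0.1},                                 # firmware exploit
    ],
}
p = success_probability(compromise_camera)
assert abs(p - 0.46) < 1e-9   # AND branch: 0.4; OR with 0.1: 1 - 0.6*0.9
```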

    A Breast Cancer Detection Problem using various Machine Learning Techniques in the Context of Health Prediction System

    Today, breast cancer is one of the most common diseases; it can cause serious complications and, in the worst case, death. Moreover, environmental changes in our everyday lives have greatly increased the likelihood of developing cancer at a young age. There is therefore an urgent need for a diagnostic tool that can help doctors detect the disease at an early stage and recommend the lifestyle changes necessary to stop its progression. Machine learning meets this need by augmenting human effort and offering greater automation with fewer errors. In this article, a breast cancer detection and prediction system is developed based on machine learning models (SVM, NB, AdaBoost). The accuracies achieved by the developed models are as follows: SVM obtained an overall score of 98.82%, while NB and AdaBoost each obtained an overall score of 97.71%.
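    As a from-scratch illustration of the NB model named above, the sketch below fits a tiny Gaussian naive-Bayes classifier on made-up two-feature samples; the study itself used a full diagnostic feature set and the models listed, so this is only a toy:

```python
import math

# Tiny Gaussian naive-Bayes classifier, trained on invented two-feature
# "tumour" samples; purely illustrative.

def fit(X, y):
    params = {}
    for label in set(y):
        rows = [x for x, c in zip(X, y) if c == label]
        feats = []
        for col in zip(*rows):                   # per-feature statistics
            mean = sum(col) / len(col)
            var = sum((v - mean) ** 2 for v in col) / len(col) + 1e-9
            feats.append((mean, var))
        params[label] = feats
    return params

def predict(params, x):
    def log_gauss(v, mean, var):
        return -0.5 * math.log(2 * math.pi * var) - (v - mean) ** 2 / (2 * var)
    scores = {label: sum(log_gauss(v, m, s) for v, (m, s) in zip(x, ps))
              for label, ps in params.items()}
    return max(scores, key=scores.get)           # uniform priors assumed

X = [[1.0, 1.2], [1.1, 0.9], [0.9, 1.0],        # benign-like cluster
     [3.0, 3.1], [3.2, 2.9], [2.9, 3.0]]        # malignant-like cluster
y = ["benign"] * 3 + ["malignant"] * 3
model = fit(X, y)
assert predict(model, [1.0, 1.0]) == "benign"
assert predict(model, [3.0, 3.0]) == "malignant"
```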

    DNA technology for big data storage and error detection solutions: Hamming code vs Cyclic Redundancy Check (CRC)

    There is an increasing need for high-capacity, high-density storage media that can retain data for a long time, due to the exponential growth in the volume of information generated. The durability and high information density of synthetic deoxyribonucleic acid (DNA) make it an attractive and promising medium for data storage. DNA data storage technology is expected to revolutionize data storage in the coming years, replacing various Big Data storage technologies. As a medium that addresses the need for high-latency, immutable information storage, DNA has several potential advantages. One key advantage is its extraordinary density: theoretically, a gram of DNA can encode 455 exabytes, at 2 bits per nucleotide. Unlike other digital storage media, synthetic DNA enables large quantities of data to be stored in a biological medium. This reduces the need for traditional storage media such as hard disks, which consume energy, require materials such as plastic or metals, and often generate electronic waste when they become obsolete or damaged. Additionally, DNA remains generally readable even after degrading for thousands of years under non-ideal conditions. Furthermore, since nature provides enzymes for reading and writing DNA as part of its biological functions, the means of retrieving data from DNA are expected to remain available for the foreseeable future. However, the high error rate poses a significant challenge for DNA-based information coding strategies: currently, DNA strand synthesis, amplification, and sequencing cannot be executed error-free. To use synthetic DNA as a storage medium for digital data, specialized systems and solutions for direct error detection and correction must therefore be implemented.
    The goal of this paper is to introduce DNA storage technology, outline the benefits and added value of this approach, and present an experiment comparing the effectiveness of two error detection and correction codes (Hamming and CRC) used in the DNA data storage strategy.
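    The paper's full pipeline is more elaborate, but the Hamming side of the comparison can be sketched end to end: encode 4 data bits into a Hamming(7,4) codeword, map bit pairs onto nucleotides (2 bits per nucleotide, as the density figure above assumes, with one padding bit here), then corrupt one bit and correct it on decode. The specific mapping and padding are illustrative choices.

```python
# Hamming(7,4) round trip with a 2-bits-per-nucleotide mapping.
# Codeword layout: [p1, p2, d1, p3, d2, d3, d4].

def hamming_encode(d):                      # d = [d1, d2, d3, d4]
    p1 = d[0] ^ d[1] ^ d[3]
    p2 = d[0] ^ d[2] ^ d[3]
    p3 = d[1] ^ d[2] ^ d[3]
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming_decode(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3         # 1-based position of the error
    if syndrome:
        c = c[:]
        c[syndrome - 1] ^= 1                # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

NUCLEOTIDE = {(0, 0): "A", (0, 1): "C", (1, 0): "G", (1, 1): "T"}

code = hamming_encode([1, 0, 1, 1])
padded = code + [0]                         # pad to an even bit count
dna = "".join(NUCLEOTIDE[(a, b)] for a, b in zip(padded[::2], padded[1::2]))
assert len(dna) == 4                        # 8 bits -> 4 nucleotides

corrupted = code[:]
corrupted[2] ^= 1                           # simulate a sequencing error
assert hamming_decode(corrupted) == [1, 0, 1, 1]   # single error corrected
```

    A CRC, by contrast, only *detects* errors over a block and would trigger a re-read or re-synthesis rather than an in-place correction, which is the trade-off the paper's experiment examines.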

    DETECTION AND RECOGNITION OF ROAD SIGNS USING YOLOv5

    In the field of deep learning, the convolutional neural network is a class of artificial neural network that has become dominant in various computer vision tasks and is widely used to solve complex problems in many areas, including driver-assistance systems in the automotive field. Convolutional neural networks overcome the limitations of other conventional machine learning approaches because they are designed to learn the spatial characteristics of features in an image automatically and adaptively. In this paper, we evaluate the inference speed and accuracy of YOLOv5s for effective traffic sign detection in various environments. The results over five classes are satisfactory: 63.7% mean average precision, and over 80% accuracy across the five categories set in this study. Compared with a YOLOv4 model based on CSP-DarkNet53 using Indonesian traffic signs, our approach yields better precision.
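    The mean-average-precision figure quoted above is built on matching detections to ground-truth boxes via intersection-over-union (IoU); a minimal sketch of that test, with made-up box coordinates, is:

```python
# IoU between two axis-aligned boxes (x1, y1, x2, y2) -- the matching
# criterion behind detection metrics such as mAP.  Coordinates below are
# illustrative, not from the paper's dataset.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

truth = (10, 10, 50, 50)               # ground-truth sign box
detection = (20, 20, 60, 60)           # a shifted prediction
score = iou(truth, detection)
assert abs(score - 900 / 2300) < 1e-9  # 30x30 overlap over union 2300
assert score < 0.5                     # counted as a miss at IoU >= 0.5
```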

    Machine Learning for a Medical Prediction System “Breast Cancer Detection” as a use case

    Breast cancer is a widespread and serious illness, which highlights the importance of an early-detection tool that can provide prognostic information and suggest the lifestyle changes necessary to prevent its advancement; moreover, environmental changes in our daily lives have significantly increased the chances of developing cancer early in life. Machine learning has become an indispensable tool in addressing this pressing need, enhancing human capabilities and offering greater automation with reduced errors. In this article, a breast cancer detection and prediction system has been created utilizing diverse machine learning models, including KNN, LR, and XGBoost.
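    As a from-scratch illustration of the KNN model named above, the toy sketch below classifies made-up two-feature samples by majority vote among the k nearest neighbours; the study itself used the full diagnostic feature set and a tuned k.

```python
import math

# k-nearest-neighbours on invented two-feature samples -- illustrative only.

def knn_predict(X, y, query, k=3):
    dists = sorted((math.dist(x, query), label) for x, label in zip(X, y))
    top = [label for _, label in dists[:k]]
    return max(set(top), key=top.count)     # majority vote among neighbours

X = [[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],    # benign-like samples
     [3.0, 3.2], [3.1, 2.9], [2.8, 3.0]]    # malignant-like samples
y = ["B", "B", "B", "M", "M", "M"]

assert knn_predict(X, y, [1.0, 1.0]) == "B"
assert knn_predict(X, y, [3.0, 3.0]) == "M"
```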